540 research outputs found

    Split-Recovery: A Constitutional Answer to the Punitive Damage Dilemma


    Combining Solution Reuse and Bound Tightening for Efficient Analysis of Evolving Systems

    Software engineers have long employed formal verification to ensure the safety and validity of their system designs. As a system changes—often via predictable, domain-specific operations—its models must also change, requiring system designers to repeatedly execute the same formal verification on similar system models. State-of-the-art formal verification techniques can be expensive at scale, a cost that is multiplied by repeated analysis. This paper presents a novel analysis technique—implemented in a tool called SoRBoT—which can automatically determine domain-specific optimizations that dramatically reduce the cost of repeatedly analyzing evolving systems. Unlike all prior approaches, which focus on either tightening the bounds for analysis or reusing all or part of prior solutions, SoRBoT’s automated derivation of domain-specific optimizations combines the benefits of both solution reuse and bound tightening while avoiding the main pitfalls of each. We experimentally evaluate SoRBoT against state-of-the-art techniques for verifying evolving specifications, demonstrating that SoRBoT substantially outperforms those techniques in run time while introducing only negligible overhead, in contrast to the expensive additional computations they require.
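    The two optimization families the abstract contrasts can be illustrated with a toy constraint re-check. This is a minimal sketch under assumed names, not SoRBoT's actual algorithm: on each evolution step we first test whether the previous solution still satisfies the new constraints (solution reuse), and only if it fails do we search the small declared domain (a stand-in for bound tightening).

```python
from itertools import product

def satisfies(assignment, constraints):
    """An assignment maps variable names to ints; constraints are predicates."""
    return all(c(assignment) for c in constraints)

def reanalyze(prev_solution, constraints, domain):
    """Re-verify an evolved model: try solution reuse first, then a
    bounded brute-force search over the declared domain."""
    # Solution reuse: the old solution often survives small model edits.
    if prev_solution is not None and satisfies(prev_solution, constraints):
        return prev_solution, "reused"
    # Bounded search: enumerate only the (small) declared domain.
    names = sorted(domain)
    for values in product(*(domain[n] for n in names)):
        cand = dict(zip(names, values))
        if satisfies(cand, constraints):
            return cand, "resolved"
    return None, "unsat"

# Original model: x + y == 4 over a small bounded domain.
dom = {"x": range(0, 5), "y": range(0, 5)}
sol, how = reanalyze(None, [lambda a: a["x"] + a["y"] == 4], dom)
# Evolved model adds a constraint; the prior solution may still apply.
sol2, how2 = reanalyze(sol, [lambda a: a["x"] + a["y"] == 4,
                             lambda a: a["x"] <= 4], dom)
```

    In the evolved model the previous solution still satisfies the added constraint, so the full search is skipped entirely; that cheap check is the essence of solution reuse.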

    Parasol: Efficient Parallel Synthesis of Large Model Spaces

    Formal analysis is an invaluable tool for software engineers, yet state-of-the-art formal analysis techniques suffer from well-known limitations in terms of scalability. In particular, some software design domains—such as tradeoff analysis and security analysis—require systematic exploration of potentially huge model spaces, which further exacerbates the problem. Despite this pressing challenge, few techniques exist to support the systematic exploration of large model spaces. This paper introduces Parasol, an approach and accompanying tool suite, to improve the scalability of large-scale formal model space exploration. Parasol presents a novel parallel model space synthesis approach, backed with unsupervised learning to automatically derive domain knowledge, guiding a balanced partitioning of the model space. This allows Parasol to synthesize the models in each partition in parallel, significantly reducing synthesis time and making large-scale systematic model space exploration for real-world systems more tractable. Our empirical results corroborate that Parasol substantially reduces (by 460% on average) the time required for model space synthesis, compared to state-of-the-art model space synthesis techniques relying on incremental and parallel constraint solving technologies as well as competing, non-learning-based partitioning methods.
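    The balanced-partitioning idea can be illustrated independently of Parasol itself. In this minimal sketch the per-model cost estimates stand in for learned domain knowledge, and all names and numbers are assumptions, not Parasol's actual design: models are assigned greedily to the least-loaded partition so parallel workers receive comparable synthesis workloads.

```python
import heapq

def balanced_partitions(costs, k):
    """Greedy longest-processing-time partitioning: visit items in
    descending estimated cost, always placing the next item on the
    currently lightest partition."""
    heap = [(0.0, i, []) for i in range(k)]  # (load, tie-break index, items)
    heapq.heapify(heap)
    for item, cost in sorted(costs.items(), key=lambda kv: -kv[1]):
        load, i, items = heapq.heappop(heap)
        items.append(item)
        heapq.heappush(heap, (load + cost, i, items))
    return [items for _, _, items in sorted(heap, key=lambda t: t[1])]

# Estimated per-model synthesis costs (a stand-in for learned domain knowledge).
costs = {"m1": 8, "m2": 7, "m3": 6, "m4": 5, "m5": 4, "m6": 3, "m7": 2, "m8": 1}
parts = balanced_partitions(costs, 2)
```

    In a real pipeline each partition would then be handed to a separate solver process; only the partitioning step is shown here.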

    GENE SET TESTING TO CHARACTERIZE MULTIVARIATELY DIFFERENTIALLY EXPRESSED GENES

    In a gene expression experiment (using oligo array, RNA-Seq, or another platform), researchers typically seek to characterize differentially expressed genes based on common gene function or pathway involvement. The field of gene set testing provides numerous characterization methods, some of which have proven to be more valid and powerful than others. These existing gene set testing methods focus on experimental designs where there is a single null hypothesis (usually involving association with a continuous or categorical phenotype) for each gene. Increasingly common experimental designs lead to multiple null hypotheses for each gene, and the characterization of these multivariately differentially expressed genes is of great interest. We explore extensions of existing gene set testing methods to achieve this characterization, with application to an RNA-Seq study in livestock cloning.

    A Shortcut for Multiple Testing on the Directed Acyclic Graph of Gene Ontology

    Background: Gene set testing has become an important analysis technique in high throughput microarray and next generation sequencing studies for uncovering patterns of differential expression of various biological processes. Often, the large number of gene sets that are tested simultaneously requires a multiplicity correction. This work provides a substantial computational improvement to an existing familywise error rate controlling multiplicity approach (the Focus Level method) for gene set testing in high throughput microarray and next generation sequencing studies using Gene Ontology graphs, which we call the Short Focus Level. Results: The Short Focus Level procedure, which performs a shortcut of the full Focus Level procedure, is achieved by extending the reach of graphical weighted Bonferroni testing to closed testing situations where restricted hypotheses are present, such as in the Gene Ontology graphs. The Short Focus Level multiplicity adjustment can perform the full top-down approach of the original Focus Level procedure, overcoming a significant disadvantage of the otherwise powerful Focus Level multiplicity adjustment. The computational and power differences of the Short Focus Level procedure as compared to the original Focus Level procedure are demonstrated both through simulation and using real data. Conclusions: The Short Focus Level procedure shows a significant increase in computation speed over the original Focus Level procedure (as much as ∼15,000 times faster). The Short Focus Level should be used in place of the Focus Level procedure whenever the logical assumptions of the Gene Ontology graph structure are appropriate for the study objectives and when either no a priori focus level of interest can be specified or the focus level is selected at a higher level of the graph, where the Focus Level procedure is computationally intractable.
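    The graphical weighted Bonferroni ingredient can be sketched on a toy DAG. The weights, p-values, and node names below are hypothetical, and this illustrates the general graphical approach only, not the Short Focus Level procedure itself: each node holds a share of the alpha budget, is rejected when its p-value falls below its share, and passes its freed weight to its children.

```python
def graphical_bonferroni(alpha, pvals, weights, children):
    """Sequential graphical weighted Bonferroni on a DAG: a node is
    rejected when p <= weight * alpha; its weight is then split equally
    among its children (freed weight at a leaf is simply retired)."""
    w = dict(weights)
    rejected, changed = set(), True
    while changed:
        changed = False
        for node, p in pvals.items():
            if node in rejected or p > w.get(node, 0.0) * alpha:
                continue
            rejected.add(node)
            kids = children.get(node, [])
            for kid in kids:
                w[kid] = w.get(kid, 0.0) + w[node] / len(kids)
            w[node] = 0.0
            changed = True
    return rejected

# Toy Gene Ontology fragment: one root term with two child terms.
pvals = {"GO:root": 0.001, "GO:A": 0.02, "GO:B": 0.5}
hits = graphical_bonferroni(0.05, pvals,
                            {"GO:root": 1.0},
                            {"GO:root": ["GO:A", "GO:B"]})
```

    Rejecting the root frees its weight for the children, so GO:A becomes testable at level 0.5 × 0.05 and is rejected in turn, while GO:B is not.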

    Game-theoretic Analysis of Effort Allocation of Contributors to Public Projects

    Public projects can succeed or fail for many reasons, such as the feasibility of the original goal and coordination among contributors. One major reason for failure is that insufficient work leaves the project partially completed. For certain types of projects, anything short of full completion is a failure (e.g., a feature request on a software project in GitHub). Therefore, project success relies heavily on individuals allocating sufficient effort. When there are multiple public projects, each contributor needs to decide how best to allocate his/her limited effort (e.g., time) to projects while considering the effort allocation decisions of other strategic contributors and his/her parameterized utilities based on values and costs for the projects. In this paper, we introduce a game-theoretic effort allocation model of contributors to public projects for modeling effort allocation of strategic contributors. We study the related Nash equilibrium (NE) computational problems and provide NP-hardness results for the existence of NE and polynomial-time algorithms for finding NE in restricted settings. Finally, we investigate the inefficiency of NE measured by the price of anarchy and price of stability. [Includes Supplementary Material.]
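    A toy version of the strategic setting can make the model concrete. All numbers and the all-or-nothing payoff below are illustrative assumptions, not the paper's model: each contributor commits their unit of effort to one project, a project pays out only if total pledged effort reaches its completion threshold, and best-response dynamics either settles into a Nash equilibrium or runs out of rounds.

```python
def utility(player, profile, values, thresholds):
    """All-or-nothing payoff: value of the chosen project if enough
    contributors pledged to complete it, otherwise zero."""
    choice = profile[player]
    effort = sum(1 for c in profile.values() if c == choice)
    return values[player][choice] if effort >= thresholds[choice] else 0.0

def best_response_dynamics(players, projects, values, thresholds, rounds=50):
    profile = {p: projects[0] for p in players}
    for _ in range(rounds):
        stable = True
        for p in players:
            current = utility(p, profile, values, thresholds)
            best = max(projects,
                       key=lambda j: utility(p, {**profile, p: j},
                                             values, thresholds))
            if utility(p, {**profile, p: best}, values, thresholds) > current:
                profile[p] = best
                stable = False
        if stable:
            return profile, True  # no profitable unilateral deviation: a NE
    return profile, False

values = {"A": {"p1": 3, "p2": 1}, "B": {"p1": 3, "p2": 1}}
thresholds = {"p1": 2, "p2": 1}
eq, converged = best_response_dynamics(["A", "B"], ["p2", "p1"],
                                       values, thresholds)
```

    Both contributors end up trapped on the low-value project p2: unilaterally moving to p1 yields zero because its threshold of two contributors is unmet, even though coordinating on p1 would pay each contributor 3. Gaps of exactly this kind are what the price of anarchy and price of stability quantify.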

    Patterns of fish use and piscivore abundance within a reconnected saltmarsh impoundment in the northern Indian River Lagoon, Florida

    Nearly all saltmarshes in east-central Florida were impounded for mosquito control during the 1960s. The majority of these marshes have since been reconnected to the estuary by culverts, providing an opportunity to effectively measure exchange of aquatic organisms. A multi-gear approach was used monthly to simultaneously estimate fish standing stock (cast net), fish exchange with the estuary (culvert traps), and piscivore abundance (gill nets and bird counts) to document patterns of fish use in a reconnected saltmarsh impoundment. Changes in saltmarsh fish abundance and exchange of fish with the estuary reflected the seasonal pattern of marsh flooding in the northern Indian River Lagoon system. During a 6-month period of marsh flooding, resident fish had continuous access to the marsh surface. Large piscivorous fish regularly entered the impoundment via creeks and ditches to prey upon small resident fish, and piscivorous birds aggregated following major fish movements to the marsh surface or to deep habitats. As water levels receded in winter, saltmarsh fish concentrated into deep habitats and emigration to the estuary ensued (200% greater biomass left the impoundment than entered). Fish abundance and community structure along the estuary shoreline (although fringed with marsh vegetation) were not analogous to marsh creeks and ditches. Perimeter ditches provided deep-water habitat for large estuarine predators, and shallow creeks served as an alternative habitat for resident fish when the marsh surface was dry. Use of the impoundment as a nursery by transients was limited to Mugil cephalus Linnaeus, but large juvenile and adult piscivorous fish used the impoundment for feeding. In conclusion, the saltmarsh impoundment was a feeding site for piscivorous fish and birds, and functioned as a net exporter of forage fish to adjacent estuarine waters.

    Statistical Methods for Assessing Individual Oocyte Viability Through Gene Expression Profiles

    In vivo derived oocytes are held as the gold standard for viability; other known origination methods are sub-par by comparison. Due to the low viability of oocytes originating from these alternate methods, research was conducted to determine and quantify the validity of these alternate origination methods. However, the larger question of viability is at the individual oocyte level. We propose and compare methods of measurement based on gene expression profiles (GEPs) in order to assess oocyte viability, independent of oocyte origin. The first is based on a previously published wRMSD quantification of GEP differences. We also consider three novel methods: a distance comparison method, a tolerance interval method, and a classification-tree decision method; each utilizes a variable selection technique that focuses on the most differentially expressed genes. In our project, we obtain GEPs of individual swine oocytes and a general GEP distribution for in vivo oocytes. This distribution served as the comparison standard for classifying the viability of all oocytes. Each proved to be a valid method for driving viability decisions for individual oocytes.
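    The wRMSD comparison the abstract builds on can be sketched as a weighted root-mean-square deviation between a single oocyte's expression profile and the in vivo reference profile. The gene weights and toy numbers below are assumptions for illustration, not the published weighting scheme.

```python
import math

def wrmsd(profile, reference, weights):
    """Weighted root-mean-square deviation between one oocyte's gene
    expression profile and a reference (e.g., in vivo mean) profile."""
    num = sum(w * (x - r) ** 2 for x, r, w in zip(profile, reference, weights))
    return math.sqrt(num / sum(weights))

# Toy profiles over four genes; a higher weight marks a more informative gene.
oocyte  = [2.0, 5.0, 1.0, 7.0]
in_vivo = [2.5, 5.0, 1.5, 6.0]
weights = [1.0, 1.0, 1.0, 2.0]
d = wrmsd(oocyte, in_vivo, weights)
```

    A viability call could then compare d against the spread of wRMSD values among the in vivo oocytes themselves, which is the role the in vivo GEP distribution plays in the study.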

    Higher fundamental frequency in bonobos is explained by larynx morphology

    Acoustic signals, shaped by natural and sexual selection, reveal ecological and social selection pressures [1]. Examining acoustic signals together with morphology can be particularly revealing. But this approach has rarely been applied to primates, where clues to the evolutionary trajectory of human communication may be found. Across vertebrate species, there is a close relationship between body size and acoustic parameters, such as formant dispersion and fundamental frequency (f0). Deviations from this acoustic allometry usually produce calls with a lower f0 than expected for a given body size, often due to morphological adaptations in the larynx or vocal tract [2]. An unusual example of an obvious mismatch between fundamental frequency and body size is found in the two closest living relatives of humans, bonobos (Pan paniscus) and chimpanzees (Pan troglodytes). Although these two ape species overlap in body size [3], bonobo calls have a strikingly higher f0 than corresponding calls from chimpanzees [4]. Here, we compare acoustic structures of calls from bonobos and chimpanzees in relation to their larynx morphology. We found that shorter vocal fold length in bonobos compared to chimpanzees accounted for species differences in f0, showing a rare case of positive selection for signal diminution in both bonobo sexes.
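    The link between vocal fold length and f0 follows from the vibrating-string approximation, a deliberate simplification since real vocal folds also vary in tension and tissue properties:

```latex
f_0 \approx \frac{1}{2L}\sqrt{\frac{\sigma}{\rho}}
```

    where L is vocal fold length, σ the longitudinal tissue stress, and ρ the tissue density. Holding σ/ρ fixed, f0 scales inversely with L, so halving vocal fold length doubles f0; this is why shorter vocal folds alone can account for the higher-pitched bonobo calls.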